Temporal Hidden Hopfield Models

Authors

  • Felix V. Agakov
  • David Barber
Abstract

Many popular probabilistic models for temporal sequences assume simple hidden dynamics or low dimensionality of discrete variables. For higher dimensional discrete hidden variables, recourse is often made to approximate mean field theories, which to date have been applied to models with only simple hidden unit dynamics. We consider a class of models in which the discrete hidden space is defined by parallel dynamics of densely connected high-dimensional stochastic Hopfield networks. For these Hidden Hopfield Models (HHMs), mean field methods are derived for learning discrete and continuous temporal sequences. We discuss applications of HHMs to classification and reconstruction of non-stationary time series. We also demonstrate a few problems (learning of incomplete binary sequences and reconstruction of 3D occupancy graphs) where distributed discrete hidden space representation may be useful. We show that while these problems cannot be easily solved by other dynamic belief networks, they are efficiently addressed by HHMs.
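
To make the distributed hidden-state idea more concrete, here is a minimal sketch of naive (fully factorized) mean-field updates for a binary hidden Hopfield chain. It assumes sigmoid parallel dynamics p(h_{t,i}=1 | h_{t-1}) = sigmoid(W_i · h_{t-1} + b_i), Gaussian emissions v_t ~ N(A h_t, noise_var I), and plugs posterior means into the neighbouring time slices; the emission model, the fixed-point schedule and the function name mean_field_posterior are illustrative assumptions, not the paper's exact derivation.

```python
# Naive mean-field sketch for a temporal hidden Hopfield chain (illustrative only).
# Hidden units h_{t,i} are binary {0,1}; transitions are parallel sigmoid/Glauber
# dynamics driven by a dense Hopfield-style weight matrix W; emissions are assumed
# Gaussian, v_t ~ N(A h_t, noise_var * I). No numerical safeguards are included.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mean_field_posterior(V, W, b, A, noise_var, n_sweeps=20):
    """Return m[t, i] ~= q(h_{t,i} = 1) for a fully factorized posterior q.

    V : (T, D) observed sequence
    W : (H, H) hidden-to-hidden Hopfield weights, b : (H,) biases
    A : (D, H) assumed emission matrix, noise_var : emission noise variance
    """
    T, _ = V.shape
    H = W.shape[0]
    m = np.full((T, H), 0.5)                       # posterior means, initialised at 0.5

    for _ in range(n_sweeps):                      # fixed-point sweeps
        for t in range(T):
            for i in range(H):
                # Emission term: logit contribution of h_{t,i}, other units at their means.
                m_cav = m[t].copy(); m_cav[i] = 0.0
                resid = V[t] - A @ m_cav
                obs = (A[:, i] @ resid - 0.5 * A[:, i] @ A[:, i]) / noise_var

                # Forward term: field from the previous slice with means plugged in
                # (log sigmoid(a) - log sigmoid(-a) = a, so the logit is just the field).
                fwd = W[i] @ m[t - 1] + b[i] if t > 0 else b[i]

                # Backward term: how flipping h_{t,i} shifts the expected log-probability
                # of slice t+1, again with means plugged in for the remaining units.
                bwd = 0.0
                if t + 1 < T:
                    base = W @ m[t] - W[:, i] * m[t, i] + b
                    on, off = sigmoid(base + W[:, i]), sigmoid(base)
                    bwd = np.sum(m[t + 1] * (np.log(on) - np.log(off))
                                 + (1.0 - m[t + 1]) * (np.log(1.0 - on) - np.log(1.0 - off)))

                m[t, i] = sigmoid(obs + fwd + bwd)
    return m
```

A learning step would then, for example, re-estimate W, b and A from these posterior statistics; that outer loop is omitted here.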

Similar resources

Approximate Learning in Temporal Hidden Hopfield Models

Many popular probabilistic models for temporal sequences assume simple hidden dynamics or low dimensionality of discrete variables. For higher dimensional discrete hidden variables, recourse is often made to approximate mean field theories, which to date have been applied to models with only simple hidden unit dynamics. We consider a class of models in which the discrete hidden space is defined...

Transient hidden chaotic attractors in a Hopfield neural system

In this letter we unveil the existence of transient hidden coexisting chaotic attractors in a simplified Hopfield neural network with three neurons. Keywords: Hopfield neural network; transient hidden chaotic attractor; limit cycle

Learning Symmetry Groups with Hidden Units: Beyond the Perceptron

Learning to recognize mirror, rotational and translational symmetries is a difficult problem for massively parallel network models. These symmetries cannot be learned by first-order perceptrons or Hopfield networks, which have no means for incorporating additional adaptive units that are hidden from the input and output layers. We demonstrate that the Boltzmann learning algorithm is capable of ...
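
For readers unfamiliar with the algorithm referenced above, the following is a rough, self-contained sketch of Boltzmann machine learning with hidden units: weights follow the difference between data-clamped and free-running pairwise correlations, both estimated here by short Gibbs chains. The network size, sampling schedule and helper names are illustrative choices, not taken from the cited paper.

```python
# Rough sketch of the Boltzmann learning rule with hidden units (binary states in {0,1}).
# Weights are nudged by <s_i s_j>_clamped - <s_i s_j>_free, estimated with Gibbs sampling;
# chain lengths and the learning rate are arbitrary illustrative values, with no burn-in.
import numpy as np

rng = np.random.default_rng(0)

def gibbs_sweep(s, W, b, idx):
    """One sequential Gibbs sweep over the units listed in idx."""
    for i in idx:
        p_on = 1.0 / (1.0 + np.exp(-(W[i] @ s + b[i])))
        s[i] = float(rng.random() < p_on)
    return s

def boltzmann_learn(data, n_hidden, n_epochs=50, n_samples=20, lr=0.05):
    """data: (N, n_visible) binary patterns. Returns symmetric weights W and biases b."""
    n_visible = data.shape[1]
    n = n_visible + n_hidden
    W, b = np.zeros((n, n)), np.zeros(n)
    hid, all_units = list(range(n_visible, n)), list(range(n))

    for _ in range(n_epochs):
        clamped, free = np.zeros((n, n)), np.zeros((n, n))

        # Positive phase: visibles clamped to each data pattern, hiddens sampled.
        for v in data:
            s = np.concatenate([v, rng.random(n_hidden) < 0.5]).astype(float)
            for _ in range(n_samples):
                s = gibbs_sweep(s, W, b, hid)
            clamped += np.outer(s, s)
        clamped /= len(data)

        # Negative phase: all units sampled freely from the current model.
        s = (rng.random(n) < 0.5).astype(float)
        for _ in range(n_samples * len(data)):
            s = gibbs_sweep(s, W, b, all_units)
            free += np.outer(s, s)
        free /= n_samples * len(data)

        dW = lr * (clamped - free)                     # Boltzmann learning rule
        np.fill_diagonal(dW, 0.0)                      # no self-connections
        W += dW                                        # outer products keep W symmetric
        b += lr * (np.diag(clamped) - np.diag(free))   # diag(s s^T) = s for binary s
    return W, b
```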

Networks of spiking neurons can emulate arbitrary Hopfield nets in temporal coding

A theoretical model for analogue computation in networks of spiking neurons with temporal coding is introduced and tested through simulations in GENESIS. It turns out that the use of multiple synapses yields very noise robust mechanisms for analogue computations via the timing of single spikes in networks of detailed compartmental neuron models. In this way, one arrives at a method for emulatin...

Complex-Valued Boltzmann Manifold

These days we have access to massive amounts of information, and it is hard to deal with it without computers. Machine learning helps computers manage such massive information. Machine learning uses various learning models, for instance decision trees, Bayesian networks, support vector machines, hidden Markov models, normal mixture distributions, neural networks and so on. Some of them are stochasti...


Journal:

Volume   Issue

Pages  -

Publication date: 2002